Minimax adaptive dimension reduction for regression

Author

  • Quentin Paris
Abstract

In this paper, we address the problem of regression estimation in the context of a p-dimensional predictor when p is large. We propose a general model in which the regression function is a composite function. Our model is a nonlinear extension of the usual sufficient dimension reduction setting. The strategy followed for estimating the regression function is based on the estimation of a new parameter, called the reduced dimension. We adopt a minimax point of view and provide both lower and upper bounds for the optimal rates of convergence for the estimation of the regression function in the context of our model. We prove that our estimate adapts, in the minimax sense, to the unknown value d of the reduced dimension and therefore achieves fast rates of convergence when d ≪ p.

Index Terms – Regression estimation, dimension reduction, minimax rates of convergence, empirical risk minimization, metric entropy.

AMS 2000 Classification – 62H12, 62G08.
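As a reading aid (not quoted from the paper), the composite structure described in the abstract can be sketched as follows; g and R are generic placeholders for the unknown link and reduction map, and d ≪ p is the reduced dimension:

\[
  r(x) \;=\; \mathbb{E}[\,Y \mid X = x\,] \;=\; g\bigl(R(x)\bigr),
  \qquad R : \mathbb{R}^p \to \mathbb{R}^d, \quad g : \mathbb{R}^d \to \mathbb{R}, \quad d \ll p .
\]

In the classical (linear) sufficient dimension reduction setting one takes R(x) = B^T x for some p × d matrix B; the model above allows R to be nonlinear.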


Related articles

Supplement to: Minimax adaptive dimension reduction for regression

In this supplement, we present an oracle inequality for the performance of least-squares estimates in the context of bounded regression. The proof follows the lines devised by Koltchinskii (2006). Here, the focus is on keeping track of the dependence on certain constants of the problem, which is crucial for the results of Paris (2013a). Index Terms – Regression estimation, least-squa...
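For context (a generic template, not the supplement's exact statement), oracle inequalities for least-squares estimation over a class F compare the estimator to the best approximation of the regression function f* within F, plus a complexity-driven remainder:

\[
  \mathbb{E}\,\|\hat f - f^*\|^2 \;\le\; C \,\inf_{f \in \mathcal{F}} \|f - f^*\|^2 \;+\; \frac{\mathrm{comp}_n(\mathcal{F})}{n},
\]

with C ≥ 1 a constant and comp_n(F) a complexity measure of the class (for instance via metric entropy). Making the dependence on C and on comp_n explicit is presumably what "keeping track of the dependence on constants" refers to.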


Adaptive Dimension Reduction with a Gaussian Process Prior

In nonparametric regression problems involving multiple predictors, there is typically interest in estimating the multivariate regression surface in the important predictors while discarding the unimportant ones. Our focus is on defining a Bayesian procedure that leads to the minimax optimal rate of posterior contraction (up to a log factor) adapting to the unknown dimension and anisotropic smo...
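As a loose illustration only (automatic relevance determination with an anisotropic kernel is a common analogue, not the Bayesian procedure of the paper; the package and parameter choices below are assumptions), a Gaussian process regressor with one length-scale per predictor effectively discards predictors whose fitted length-scales come out very large:

# Sketch only: ARD-style GP regression with scikit-learn (assumed available);
# predictors with very large fitted length-scales have little influence on the fit.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.uniform(size=(n, p))
# Only the first two predictors matter in this synthetic example.
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

kernel = 1.0 * RBF(length_scale=np.ones(p))      # one length-scale per dimension (anisotropic)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True).fit(X, y)
print(gpr.kernel_.k2.length_scale)               # large values flag (near-)irrelevant predictors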


k-NN Regression Adapts to Local Intrinsic Dimension

Many nonparametric regressors were recently shown to converge at rates that depend only on the intrinsic dimension of data. These regressors thus escape the curse of dimension when high-dimensional data has low intrinsic dimension (e.g. a manifold). We show that k-NN regression is also adaptive to intrinsic dimension. In particular our rates are local to a query x and depend only on the way mas...
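For orientation (a plain k-NN regressor in NumPy, not the paper's analysis), the estimator averages the responses of the k training points nearest to the query x, so its behaviour depends only on how the data mass concentrates around x; k and the Euclidean metric below are illustrative choices:

# Minimal k-NN regression sketch (NumPy only).
import numpy as np

def knn_regress(X_train, y_train, x_query, k=5):
    """Average the responses of the k training points closest to x_query."""
    dists = np.linalg.norm(X_train - x_query, axis=1)   # distances to the query point
    nearest = np.argsort(dists)[:k]                      # indices of the k nearest neighbours
    return y_train[nearest].mean()

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 10))                                      # ambient dimension 10
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(500)     # response driven by one coordinate
print(knn_regress(X, y, X[0], k=7))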


On the Minimax Optimality of Block Thresholded Wavelets Estimators for ?-Mixing Process

We propose a wavelet-based estimator of the regression function for a sequence of ?-mixing random variables with a common one-dimensional probability density function. Some asymptotic properties of the proposed estimator based on block thresholding are investigated. It is found that the estimators achieve optimal minimax convergence rates over large class...
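For reference (the standard keep-or-kill form of block thresholding, not necessarily this paper's exact rule), empirical wavelet coefficients at level j are grouped into blocks B of length L, and an entire block is retained only if its average energy exceeds a threshold:

\[
  \hat{\beta}_{jk}^{\mathrm{BT}} \;=\; \hat{\beta}_{jk}\,
  \mathbf{1}\Bigl\{ \tfrac{1}{L} \textstyle\sum_{k' \in B} \hat{\beta}_{jk'}^{\,2} \;>\; \lambda\,\sigma^2 / n \Bigr\},
  \qquad k \in B,
\]

where λ is a tuning constant, σ² the noise level and n the sample size; James-Stein variants shrink the block rather than keeping or killing it outright.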


Thresholding projection estimators in functional linear models

We consider the problem of estimating the regression function in functional linear regression models by proposing a new type of projection estimator which combines dimension reduction and thresholding. The introduction of a threshold rule makes it possible to obtain consistency under broad assumptions as well as minimax rates of convergence under additional regularity hypotheses. We also consider the particu...
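For orientation (a generic formulation, not necessarily the paper's exact estimator), the functional linear model and a thresholded projection estimator of the slope function β can be written as follows, with (φ_j) an orthonormal basis of L²[0,1]:

\[
  Y \;=\; \int_0^1 \beta(t)\, X(t)\, dt + \varepsilon,
  \qquad
  \hat{\beta}(t) \;=\; \sum_{j=1}^{m} \hat{b}_j\, \mathbf{1}\{|\hat{b}_j| > \tau_n\}\, \varphi_j(t),
\]

where the \hat b_j are empirical basis coefficients, the projection dimension m carries out the dimension reduction, and the threshold τ_n discards unstable coefficients.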



Journal:
  • J. Multivariate Analysis

Volume 128, Issue

Pages  -

Published 2014